161 research outputs found

    Use of Dynamic Models and Operational Architecture to Solve Complex Navy Challenges

    The United States Navy established 8 Maritime Operations Centers (MOC) to enhance the command and control of forces at the operational level of warfare. Each MOC is a headquarters manned by qualified joint operational-level staffs and enabled by globally interoperable C4I systems. To assess and refine MOC staffing, equipment, and schedules, a dynamic software model was developed. The model leverages pre-existing operational process architecture, joint military task lists that define activities and their precedence relations, and Navy documents that specify manning and roles per activity. The software model serves as a "computational wind tunnel" in which to test a MOC on a mission and to refine its structure, staffing, processes, and schedules. More generally, the model supports resource allocation decisions concerning Doctrine, Organization, Training, Materiel, Leadership, Personnel and Facilities (DOTMLPF) at MOCs around the world. A rapid prototyping effort produced this software in less than five months, using an integrated process team consisting of MOC military and civilian staff, modeling experts, and software developers. The work reported here was conducted for Commander, United States Fleet Forces Command in Norfolk, Virginia, code N5-OLW (Operational Level of War), which facilitates the identification, consolidation, and prioritization of MOC capability requirements, and the implementation and delivery of MOC solutions.
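The model's use of task lists with precedence relations suggests a standard critical-path style computation over an activity graph. As a rough, hypothetical sketch (not the MOC model itself; the activity names and the single-pass scheduler are illustrative assumptions), an earliest-start schedule over an activity DAG can be computed like this:

```python
from collections import deque

def earliest_schedule(durations, precedes):
    """Earliest-start schedule for activities with precedence relations
    (a longest-path / critical-path forward pass over the activity DAG).

    durations: {activity: time}
    precedes:  {activity: [successor activities]}
    """
    # Count incoming precedence edges for each activity.
    indeg = {a: 0 for a in durations}
    for a, succs in precedes.items():
        for b in succs:
            indeg[b] += 1
    start = {a: 0.0 for a in durations}
    ready = deque(a for a, d in indeg.items() if d == 0)
    while ready:
        a = ready.popleft()
        finish = start[a] + durations[a]
        for b in precedes.get(a, []):
            # A successor cannot start before all predecessors finish.
            start[b] = max(start[b], finish)
            indeg[b] -= 1
            if indeg[b] == 0:
                ready.append(b)
    return start
```

With hypothetical activities `plan` (2 h) preceding `brief` (1 h) and `execute`, and `brief` also preceding `execute`, the pass yields earliest starts 0, 2, and 3 hours respectively.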

    Energy-Filtering Transmission Electron Microscopy of Biological Specimens

    By energy-filtering transmission electron microscopy (EFTEM), electrons can be separated by their energy losses. An electron-energy filter, added to the microscope column, allows the measurement of the energy distribution of transmitted electrons that have lost energy (< 2,000 eV, with an energy resolution of ~1 eV). These filtered electrons, recorded either as a spectrum or as an image, are composed of two superimposed parts: (a) the unspecific energy-loss population (the continuum) and (b) the specific element-related energy-loss population (the edges). At the edges, electron data in spectra and images are mathematically processed to obtain the desired element-related net-intensity values or images. These data are related to the total transmitted electron intensity from the zero- and low-loss spectral region, giving the relative spectral or image intensity ratios (SR*x, IR*x), which can be related to the element concentration. The acquisition of the zero-loss and low-loss data is hampered by the restricted dynamic range of the TV camera; by introducing calibrated attenuation filters in the optical path to the TV camera, more reliable values for SR*x and IR*x can be acquired. By adding bio-standards adjacent to the tissue, known and unknown concentrations of the element are present in the same ultrathin section, and the bias in the concentration estimation can be determined. Practical examples are given for the estimation of the iron concentration in siderosomes, boron in melanosomes, and calcium in calcium oxalate monohydrate crystals.
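The edge processing described above (net element-related intensity at an edge, ratioed against the zero- and low-loss intensity) is conventionally done with a power-law background extrapolation. A minimal sketch, assuming the standard A·E^(−r) background model and hypothetical window parameters (not the authors' exact procedure):

```python
import numpy as np

def net_edge_intensity(energy, counts, pre_lo, pre_hi, edge_lo, edge_hi):
    """Element-specific net intensity at an energy-loss edge.

    A power-law background A * E**(-r) is fitted in the pre-edge window
    [pre_lo, pre_hi] (linear least squares in log-log coordinates) and
    extrapolated under the edge window [edge_lo, edge_hi]; the net intensity
    is the background-subtracted sum over the edge window.
    """
    pre = (energy >= pre_lo) & (energy <= pre_hi)
    slope, intercept = np.polyfit(np.log(energy[pre]), np.log(counts[pre]), 1)
    background = np.exp(intercept) * energy ** slope
    edge = (energy >= edge_lo) & (energy <= edge_hi)
    return np.sum(counts[edge] - background[edge])

def spectral_ratio(energy, counts, total_zero_low_loss, **windows):
    """Relative spectral intensity ratio SR*_x: net edge counts divided by
    the total transmitted (zero- plus low-loss) intensity."""
    return net_edge_intensity(energy, counts, **windows) / total_zero_low_loss
```

On a synthetic spectrum that is an exact power law plus a constant step above the edge, the fit recovers the background and the net intensity equals the step height times the number of edge channels.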

    An optimally concentrated Gabor transform for localized time-frequency components

    Gabor analysis is one of the most common instances of time-frequency signal analysis. Choosing a suitable window for the Gabor transform of a signal is often a challenge for practical applications, in particular in audio signal processing. Many time-frequency (TF) patterns of different shapes may be present in a signal, and they cannot all be sparsely represented in the same spectrogram. We propose several algorithms, which provide optimal windows for a user-selected TF pattern with respect to different concentration criteria. We base our optimization algorithms on ℓ^p-norms as measures of TF spreading. For a given number of sampling points in the TF plane, we also propose optimal lattices to be used with the obtained windows. We illustrate the potential of the method on selected numerical examples.
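One way to read the ℓ^p criterion: for a spectrogram normalized to unit energy, an ℓ^p quasi-norm with p < 2 is smaller when the energy sits in few coefficients, so minimizing it over candidate windows favors the sparsest representation. A toy sketch of window selection by this measure (the Gaussian window family, hop size, and p value are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

def spectrogram(signal, window, hop):
    """Magnitude short-time Fourier transform with the given analysis
    window and hop size (a basic sampled Gabor transform)."""
    n = len(window)
    frames = [signal[i:i + n] * window
              for i in range(0, len(signal) - n + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

def lp_concentration(spec, p=0.5):
    """l^p quasi-norm of the l^2-normalized spectrogram; for p < 2,
    smaller values mean a more concentrated (sparser) representation."""
    s = spec / np.sqrt(np.sum(spec ** 2))
    return np.sum(s ** p) ** (1.0 / p)

def best_window(signal, lengths, hop=64, p=0.5):
    """Pick the Gaussian window length that minimizes the l^p measure."""
    def gauss(n):
        t = np.arange(n) - (n - 1) / 2
        return np.exp(-0.5 * (t / (n / 8)) ** 2)
    return min(lengths, key=lambda n: lp_concentration(
        spectrogram(signal, gauss(n), hop), p))
```

A sanity check on the measure itself: a spectrogram with all energy in one coefficient scores lower (more concentrated) than a flat one.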

    The KELT Follow-Up Network And Transit False-Positive Catalog: Pre-Vetted False Positives For TESS

    The Kilodegree Extremely Little Telescope (KELT) project has been conducting a photometric survey of transiting planets orbiting bright stars for over 10 years. The KELT images have a pixel scale of ~23″ pixel⁻¹, very similar to that of NASA's Transiting Exoplanet Survey Satellite (TESS), as well as a large point-spread function, and the KELT reduction pipeline uses a weighted photometric aperture with radius 3′. At this angular scale, multiple stars are typically blended in the photometric apertures. In order to identify false positives and confirm transiting exoplanets, we have assembled a follow-up network (KELT-FUN) to conduct imaging with spatial resolution, cadence, and photometric precision higher than the KELT telescopes can achieve, as well as spectroscopic observations of the candidate host stars. The KELT-FUN team has followed up over 1,600 planet candidates since 2011, resulting in more than 20 planet discoveries. Excluding ~450 false alarms of non-astrophysical origin (i.e., instrumental noise or systematics), we present an all-sky catalog of the 1,128 bright stars (6 < V < 13) that show transit-like features in the KELT light curves but were subsequently determined to be astrophysical false positives (FPs) after photometric and/or spectroscopic follow-up observations. The KELT-FUN team continues to pursue KELT and other planet candidates and will eventually follow up certain classes of TESS candidates. The KELT FP catalog will help minimize the duplication of follow-up observations by current and future transit surveys such as TESS.

    Active Singularities for Multivehicle Motion Planning in an N-Vortex System

    This paper presents a path-planning paradigm for distributed control of multiple sensor platforms in a geophysical flow well approximated by a point-vortex model. We utilize Hamiltonian dynamics to generate control vector fields for vehicle motion in N-vortex flows using the concept of an active singularity whose strength is a tunable control input. We introduce active singularities that are virtual point vortices, possibly collocated with virtual point sources or sinks. We provide a principled method to stabilize relative equilibria of these virtual vortices in the presence of the actual point vortices, which represent the underlying geophysical flow. We illustrate how these relative equilibria may be useful for vehicle path planning and sampling in a geophysical flow. Preliminary results presented here are based on an adaptive control design.
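The underlying N-vortex dynamics can be sketched directly in complex notation, including the source/sink term of the kind the paper collocates with a virtual vortex. This is a generic potential-flow sketch under standard assumptions, not the paper's control or stabilization design:

```python
import numpy as np

def induced_velocity(z, positions, strengths, source_rates=None):
    """Complex velocity u + i*v at point z induced by point vortices
    (circulations Gamma_j) and optional collocated sources/sinks (rates Q_j).

    Standard 2D potential-flow elements in complex notation:
      vortex:       u + i v = i * Gamma / (2 * pi * conj(z - z_j))
      source/sink:  u + i v =     Q     / (2 * pi * conj(z - z_j))
    """
    if source_rates is None:
        source_rates = np.zeros(len(positions))
    v = 0j
    for zj, gamma, q in zip(positions, strengths, source_rates):
        dz = z - zj
        if abs(dz) < 1e-12:          # skip self-induction
            continue
        v += (1j * gamma + q) / (2.0 * np.pi * np.conj(dz))
    return v

def step_vortices(positions, strengths, dt):
    """Advance the N-vortex system one classical RK4 step; each vortex is
    advected by the velocity induced by all the others."""
    pos = np.asarray(positions, dtype=complex)

    def rhs(p):
        return np.array([induced_velocity(z, p, strengths) for z in p])

    k1 = rhs(pos)
    k2 = rhs(pos + 0.5 * dt * k1)
    k3 = rhs(pos + 0.5 * dt * k2)
    k4 = rhs(pos + dt * k3)
    return pos + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
```

Two equal vortices co-rotate about their centroid, and the integrator preserves their separation and centroid to high accuracy, which is the kind of relative equilibrium the paper exploits.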

    ReCombine: A Suite of Programs for Detection and Analysis of Meiotic Recombination in Whole-Genome Datasets

    In meiosis, the exchange of DNA between chromosomes by homologous recombination is a critical step that ensures proper chromosome segregation and increases genetic diversity. Products of recombination include reciprocal exchanges, known as crossovers, and non-reciprocal gene conversions, or non-crossovers. The mechanisms underlying meiotic recombination remain elusive, largely because of the difficulty of analyzing large numbers of recombination events by traditional genetic methods. These traditional methods are increasingly being superseded by high-throughput techniques capable of surveying meiotic recombination on a genome-wide basis. Next-generation sequencing or microarray hybridization is used to genotype thousands of polymorphic markers in the progeny of hybrid yeast strains. New computational tools are needed to perform this genotyping and to find and analyze recombination events. We have developed a suite of programs, ReCombine, for using short sequence reads from next-generation sequencing experiments to genotype yeast meiotic progeny. After genotyping, the program CrossOver, a component of ReCombine, detects recombination products and classifies them into categories based on the features found at each location and their distribution among the various chromatids. CrossOver is also capable of analyzing segregation data from microarray experiments or other sources. This package of programs is designed to allow even researchers without computational expertise to use high-throughput, whole-genome methods to study the molecular mechanisms of meiotic recombination.
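As a much-simplified illustration of the kind of classification CrossOver performs (a toy, not ReCombine's actual logic; the two-parent genotype labels are hypothetical): along one chromatid's ordered marker calls, an interior run of one parent's genotype flanked on both sides by the other parent resembles a gene-conversion tract, while a switch that persists to the chromosome end resembles a crossover.

```python
def classify_events(genotypes):
    """Toy recombination-event caller for one chromatid.

    genotypes: ordered marker calls, e.g. 'P1'/'P2' for the two parents.
    Returns (kind, start_marker, end_marker) tuples: interior runs flanked
    by the other parent are 'conversion' tracts; remaining transitions
    between alternating runs are 'crossover' boundaries.
    """
    # Collapse calls into runs: [genotype, first marker, last marker].
    runs = []
    for i, g in enumerate(genotypes):
        if runs and runs[-1][0] == g:
            runs[-1][2] = i
        else:
            runs.append([g, i, i])
    events = []
    # Peel off interior runs whose flanks match (conversion tracts),
    # merging the flanking runs into one.
    k = 1
    while k < len(runs) - 1:
        if runs[k - 1][0] == runs[k + 1][0]:
            events.append(("conversion", runs[k][1], runs[k][2]))
            runs[k - 1][2] = runs[k + 1][2]
            del runs[k:k + 2]
        else:
            k += 1
    # Every boundary left between alternating runs is a crossover.
    for k in range(1, len(runs)):
        events.append(("crossover", runs[k - 1][2], runs[k][1]))
    return events
```

For example, `['P1', 'P1', 'P2', 'P1', 'P1']` yields one conversion tract at marker 2, while `['P1', 'P1', 'P1', 'P2', 'P2', 'P2']` yields one crossover between markers 2 and 3.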